TensorFlow and Keras


AI-ANNE: (A) (N)eural (N)et for (E)xploration: Transferring Deep Learning Models onto Microcontrollers and Embedded Systems

Klinkhammer, Dennis

arXiv.org Artificial Intelligence

This working paper explores the integration of neural networks into resource-constrained embedded systems such as the Raspberry Pi Pico / Raspberry Pi Pico 2. A TinyML approach transfers neural networks directly onto these microcontrollers, enabling real-time, low-latency, and energy-efficient inference while maintaining data privacy. To this end, AI-ANNE: (A) (N)eural (N)et for (E)xploration is presented, which facilitates the transfer of pre-trained models from high-performance platforms such as TensorFlow and Keras onto microcontrollers, using a lightweight programming language like MicroPython. This approach demonstrates how neural network building blocks, such as neurons, layers, density and activation functions, can be implemented in MicroPython in order to deal with the computational limitations of embedded systems. Based on the Raspberry Pi Pico / Raspberry Pi Pico 2, two different neural networks on microcontrollers are presented for an example of data classification. As a further application example, such a microcontroller can be used for condition monitoring, where immediate corrective measures are triggered on the basis of sensor data. Overall, this working paper presents a very easy-to-implement way of using neural networks on energy-efficient devices such as microcontrollers. This makes AI-ANNE: (A) (N)eural (N)et for (E)xploration not only suited for practical use, but also an educational tool that offers clear insights into how neural networks operate.
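The building blocks the paper describes (neurons, layers, activation functions hand-coded in MicroPython) can be sketched roughly as follows. This is a minimal illustration using only lists and the standard `math` module, so it would also run on a Raspberry Pi Pico; the function names and weight values are ours, not taken from AI-ANNE itself.

```python
# Minimal dense layer with activation in plain (Micro)Python.
# All names and parameter values below are illustrative, not from AI-ANNE.
import math

def relu(x):
    return x if x > 0.0 else 0.0

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def dense(inputs, weights, biases, activation):
    # weights: one row of input weights per output neuron
    out = []
    for w_row, b in zip(weights, biases):
        z = sum(i * w for i, w in zip(inputs, w_row)) + b
        out.append(activation(z))
    return out

# Two-layer forward pass with (made-up) pre-trained parameters,
# mimicking a model exported from TensorFlow/Keras onto the microcontroller
x = [0.5, -1.2]
h = dense(x, [[0.8, -0.4], [0.3, 0.9]], [0.1, -0.2], relu)
y = dense(h, [[1.0, -1.0]], [0.0], sigmoid)
```

In practice, the weight lists would be filled with parameters exported from a model trained on a high-performance platform, while the forward pass above is all that needs to run on the device.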


Building a Neural Network using Keras and TensorFlow in Python

#artificialintelligence

Python can make use of artificial intelligence through various libraries and frameworks such as TensorFlow, Keras, and scikit-learn. For example, one can use TensorFlow and Keras to build a neural network for image classification. The model can be trained on a dataset of images, and then used to predict the class of new images. This is a simple example, but it demonstrates how easy it is to use Python with TensorFlow and Keras to train a neural network and make predictions with artificial intelligence. One advanced example of using Python and artificial intelligence is to train a deep learning model for natural language processing tasks, such as language translation.
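The workflow described above can be sketched with a few lines of Keras. To keep the example self-contained, it trains on synthetic stand-in "images" (random 28x28 arrays with 10 classes) rather than a real dataset; the layer sizes and training settings are illustrative choices, not from the article.

```python
# A minimal Keras image classifier on synthetic stand-in data.
import numpy as np
from tensorflow import keras

# 100 random grayscale 28x28 "images", labels from 10 classes
x = np.random.rand(100, 28, 28).astype("float32")
y = np.random.randint(0, 10, size=100)

model = keras.Sequential([
    keras.layers.Input(shape=(28, 28)),
    keras.layers.Flatten(),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),  # one unit per class
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x, y, epochs=1, verbose=0)

# Predict the class of "new" images
preds = model.predict(x[:5], verbose=0).argmax(axis=1)
```

With a real dataset in place of the random arrays, the same `fit`/`predict` pattern is all that changes between toy and practical use.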


How to Implement Multi-Head Attention From Scratch in TensorFlow and Keras

#artificialintelligence

We have already familiarised ourselves with the theory behind the Transformer model and its attention mechanism, and we have already started our journey of implementing a complete model by seeing how to implement the scaled dot-product attention. We shall now progress one step further into our journey by encapsulating the scaled dot-product attention into a multi-head attention mechanism, of which it is a core component. Our end goal remains the application of the complete model to Natural Language Processing (NLP). In this tutorial, you will discover how to implement multi-head attention from scratch in TensorFlow and Keras. Photo by Everaldo Coelho, some rights reserved.
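The mechanics the tutorial implements can be sketched framework-free in NumPy: scaled dot-product attention as the core, wrapped into head splitting, per-head attention, and a final output projection. The names and shapes below are our own illustration, not the tutorial's TensorFlow code.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = q.shape[-1]
    scores = q @ k.transpose(0, 1, 3, 2) / np.sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ v

def multi_head_attention(x, num_heads, w_q, w_k, w_v, w_o):
    batch, seq_len, d_model = x.shape
    d_head = d_model // num_heads

    def split_heads(t):
        # (batch, seq, d_model) -> (batch, heads, seq, d_head)
        return t.reshape(batch, seq_len, num_heads, d_head).transpose(0, 2, 1, 3)

    q, k, v = split_heads(x @ w_q), split_heads(x @ w_k), split_heads(x @ w_v)
    heads = scaled_dot_product_attention(q, k, v)
    # Concatenate heads back to (batch, seq, d_model), then project
    concat = heads.transpose(0, 2, 1, 3).reshape(batch, seq_len, d_model)
    return concat @ w_o

rng = np.random.default_rng(0)
d_model, num_heads = 8, 2
x = rng.normal(size=(1, 4, d_model))
w = [rng.normal(size=(d_model, d_model)) for _ in range(4)]  # W_Q, W_K, W_V, W_O
out = multi_head_attention(x, num_heads, *w)
```

In the tutorial's Keras version, the four projection matrices become trainable `Dense` layers, but the reshape-attend-concatenate pattern is the same.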


Implementing the Transformer Decoder From Scratch in TensorFlow and Keras

#artificialintelligence

There are many similarities between the Transformer encoder and decoder, such as their implementation of multi-head attention, layer normalization, and a fully connected feed-forward network as their final sub-layer. Having implemented the Transformer encoder, we will now proceed to apply our knowledge in implementing the Transformer decoder, as a further step towards implementing the complete Transformer model. Our end goal remains the application of the complete model to Natural Language Processing (NLP). In this tutorial, you will discover how to implement the Transformer decoder from scratch in TensorFlow and Keras. Photo by François Kaiser, some rights reserved.
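The decoder-layer structure described above can be sketched in NumPy with a simplified single-head attention and untrained weights. This illustrates only the sub-layer ordering (masked self-attention, cross-attention over the encoder output, feed-forward, each with a residual connection and layer normalization); it is not the tutorial's implementation.

```python
import numpy as np

def attention(q, k, v, mask=None):
    scores = q @ k.T / np.sqrt(q.shape[-1])
    if mask is not None:
        scores = np.where(mask, scores, -1e9)  # hide future positions
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w = w / w.sum(axis=-1, keepdims=True)
    return w @ v

def layer_norm(x, eps=1e-6):
    mu = x.mean(axis=-1, keepdims=True)
    sd = x.std(axis=-1, keepdims=True)
    return (x - mu) / (sd + eps)

def decoder_layer(y, enc_out):
    # 1) masked self-attention over the target sequence
    causal = np.tril(np.ones((y.shape[0], y.shape[0]), dtype=bool))
    y = layer_norm(y + attention(y, y, y, causal))
    # 2) cross-attention: queries from decoder, keys/values from encoder
    y = layer_norm(y + attention(y, enc_out, enc_out))
    # 3) position-wise feed-forward network
    ffn = np.maximum(y @ W1, 0.0) @ W2
    return layer_norm(y + ffn)

rng = np.random.default_rng(0)
d = 8
W1, W2 = rng.normal(size=(d, 16)), rng.normal(size=(16, d))
tgt, enc = rng.normal(size=(5, d)), rng.normal(size=(6, d))
out = decoder_layer(tgt, enc)
```

The causal mask is the one structural difference from the encoder: each target position may only attend to itself and earlier positions.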


gMLP: What it is and how to use it in practice with Tensorflow and Keras?

#artificialintelligence

gMLP demonstrates near state-of-the-art results on NLP and computer vision tasks while using far fewer trainable parameters than corresponding Transformer models. The most important component of state-of-the-art Transformer architectures is the attention mechanism, which is used to find which relationships between data items are important for the neural network. To spot the innovation of gMLP, let's first understand what the already mentioned terms static parameterization and spatial projection mean. As described above, attention mechanisms change dynamically depending on the inputs.
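A rough NumPy sketch of gMLP's Spatial Gating Unit makes the contrast concrete: the spatial projection is a fixed (static) weight matrix applied across the token axis, whereas attention weights are recomputed from each input. The names and shapes below are ours, and the layer normalization the paper applies before the projection is omitted for brevity.

```python
import numpy as np

def spatial_gating_unit(x, W, b):
    # Split channels into two halves; project one half across the
    # *sequence* (token) axis with a static matrix W, then use it to
    # gate the other half elementwise. W does not depend on the input.
    u, v = np.split(x, 2, axis=-1)   # each (seq_len, d/2)
    v = W @ v + b                    # static spatial projection over tokens
    return u * v                     # elementwise gating

rng = np.random.default_rng(0)
seq_len, d = 6, 8
x = rng.normal(size=(seq_len, d))
# Initialize the spatial weights near zero and the bias at one,
# so the unit starts close to an identity gate (as suggested in the paper)
W = 0.01 * rng.normal(size=(seq_len, seq_len))
b = np.ones((seq_len, 1))
out = spatial_gating_unit(x, W, b)
```

Because `W` has shape `(seq_len, seq_len)` and is learned once, the token mixing is statically parameterized; there is no per-input softmax over scores as in attention.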


Hands-On Transfer Learning with Python: Implement advanced deep learning and neural network models using TensorFlow and Keras: Sarkar, Dipanjan, Bali, Raghav, Ghosh, Tamoghna: 9781788831307: Amazon.com: Books

#artificialintelligence

Dipanjan Sarkar is a Data Scientist at Intel, on a mission to make the world more connected and productive. He primarily works on data science, analytics, business intelligence, application development, and building large-scale intelligent systems. He holds a master of technology degree in Information Technology with specializations in Data Science and Software Engineering from the International Institute of Information Technology, Bangalore. He is an avid supporter of self-learning, especially Massive Open Online Courses, and also holds a Data Science Specialization from Johns Hopkins University on Coursera. Dipanjan has been an analytics practitioner for several years now, specializing in statistical, predictive, and text analytics.


Generative Adversarial Networks Projects: Build next-generation generative models using TensorFlow and Keras: Ahirwar, Kailash: 9781789136678: Amazon.com: Books

#artificialintelligence

Kailash Ahirwar is a machine learning and deep learning enthusiast. He has worked in many areas of Artificial Intelligence, ranging from Natural Language Processing (NLP) and Computer Vision to generative modeling using GANs. He co-founded Mate Labs and currently leads its technical team. He uses GANs to build models for tasks such as turning paintings into photos and enhancing the resolution of images. He is highly optimistic about Artificial General Intelligence and believes that AI is going to be the workhorse of human evolution.


GitHub - Nyandwi/ModernConvNets: Revisions and implementations of modern Convolutional Neural Networks architectures in TensorFlow and Keras

#artificialintelligence

It has been a joy to learn, revise, and implement these CNN architectures. I hope you will enjoy the materials in this repository as much as I did! For any error, suggestion, or simply anything else, you can reach out through email, Twitter, or LinkedIn.


Grape Leaves Disease Detection

#artificialintelligence

We trained our deep learning model using TensorFlow and Keras to detect grape leaf disease, achieving an accuracy of 93%.


Tensorflow and Keras For Neural Networks and Deep Learning - CouponED

#artificialintelligence

It is a practical, hands-on course: we will spend some time on the theoretical concepts related to data science, but the majority of the course will focus on implementing different techniques on real data and interpreting the results. After each video you will learn a new concept or technique that you can apply to your own projects!